Fisher Information in Gaussian Graphical Models

Author

  • Jason K. Johnson
Abstract

This note summarizes various derivations, formulas and computational algorithms relevant to the Fisher information matrix of Gaussian graphical models with respect to either an exponential parameterization (related to the information form) or the corresponding moment parameterization.

1 Gauss-Markov Models

The probability density of a Gaussian random vector x ∈ R^n may be expressed in information form: p(x) ∝ exp{−(1/2) x^T J x + h^T x} with parameters h ∈ R^n and J ∈ R^{n×n}, where J is a symmetric positive-definite matrix. The mean and covariance of x are given by: μ ≜ E_p[x] = J^{−1}h and P ≜ E_p[(x − μ)(x − μ)^T] = J^{−1}, where we denote the expectation of f(x) with respect to p by E_p[f] ≜ ∫ p(x)f(x)dx. In this note, we focus on the zero-mean case μ = 0; thus, h = 0 as well. A Gaussian distribution is Markov on a graph G = (V, E), with vertices V = {1, …, n}, if and only if J_ij = 0 for all {i, j} ∉ E. Thus, Markov models have a reduced information parameterization based on a sparse J matrix.

2 Exponential Family and Fisher Information

The Gaussian distributions can also be represented as an exponential family: p(x) = exp{θ^T φ(x) − Φ(θ)} with parameters θ ∈ R^d and sufficient statistics φ : R^n → R^d. To obtain the information form we define the statistics: φ(x) = (x_i^2, i ∈ V) ∪ (x_i x_j, {i, j} ∈ E). The corresponding exponential parameters then correspond to elements of the information matrix: θ = (−(1/2) J_ii, i ∈ V) ∪ (−J_ij, {i, j} ∈ E).
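The information-form relations above can be checked numerically. The following is a minimal sketch (assuming NumPy; the 3-node chain graph and the particular J values are hypothetical illustrations, not from the original note): it builds a sparse J for a chain 1–2–3, recovers the moment parameters μ = J^{−1}h and P = J^{−1}, and shows that the Markov structure appears as a zero in J even though P is dense.

```python
import numpy as np

# Information matrix J for a 3-node chain graph 1 - 2 - 3.
# J[0, 2] = 0 because {1, 3} is not an edge; J is symmetric positive-definite.
J = np.array([[ 2.0, -0.5,  0.0],
              [-0.5,  2.0, -0.5],
              [ 0.0, -0.5,  2.0]])
h = np.zeros(3)  # zero-mean case: h = 0

# Moment parameters: mu = J^{-1} h and P = J^{-1}.
mu = np.linalg.solve(J, h)   # solve is preferable to forming the inverse for mu
P = np.linalg.inv(J)

# The conditional independence of x1 and x3 given x2 is encoded as a
# zero entry of J; the covariance P is generally dense.
assert J[0, 2] == 0.0
assert abs(P[0, 2]) > 1e-9
assert np.allclose(mu, 0.0)
```

Note the asymmetry this illustrates: sparsity of the graph lives in the information parameters (J), not in the moment parameters (P), which is why Markov models are naturally parameterized in information form.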


Related articles

Fitting Directed Graphical Gaussian Models with One Hidden Variable

We discuss directed acyclic graph (DAG) models to represent the independence structure of linear Gaussian systems with continuous variables: such models can be interpreted as a set of recursive univariate regressions. Then we consider Gaussian models in which one of the variables is not observed and we show how the incomplete log-likelihood of the observed data can be maximized using the EM. As...


Approximate inference in Gaussian graphical models

The focus of this thesis is approximate inference in Gaussian graphical models. A graphical model is a family of probability distributions in which the structure of interactions among the random variables is captured by a graph. Graphical models have become a powerful tool to describe complex high-dimensional systems specified through local interactions. While such models are extremely rich and...


Image Segmentation using Gaussian Mixture Model

Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. In this paper, we fit a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, a pixel labeling corresponding to each pixel of the true image was made by Bayes' rule. In fact,...



Scaling up Natural Gradient by Sparsely Factorizing the Inverse Fisher Matrix

Second-order optimization methods, such as natural gradient, are difficult to apply to high-dimensional problems, because they require approximately solving large linear systems. We present FActorized Natural Gradient (FANG), an approximation to natural gradient descent where the Fisher matrix is approximated with a Gaussian graphical model whose precision matrix can be computed efficiently. We ...



Journal:

Volume   Issue

Pages  -

Publication date: 2006